displayport: use correct buffer size in setOuiSource #1034
tuxedo-aer wants to merge 1 commit into NVIDIA:main
Conversation
Don't send more bytes than necessary to avoid problems with displays that only accept the exact number of bytes defined in the DisplayPort standard.
Thanks @tuxedo-aer. We have already fixed the issue internally by correctly allocating the source OUI buffer so that it does not write to addresses beyond the valid range. The fix is tracked in internal bug 5783114.
That's not what the patch is doing. This patch is about avoiding writes from 30Ah to 30Fh, which are not specified in the standard. It has nothing to do with the reserved region.
Hi, would you mind quoting the exact DisplayPort specification version and section so we can cross-reference? The change our developer made internally is the same, except it excludes only 30Ch to 30Fh. The internal change notes that 30Ah and 30Bh are used on sinks. Thanks,
Sorry, I overlooked that 30Ah and 30Bh are indeed valid and that the reserved region only starts at 30Ch. So yes, the internal change you made should suffice. Thanks!
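To make the boundary the thread settles on explicit, it can be expressed as a small range check. This is only an illustration: the macro and function names are hypothetical, and the layout comes solely from the addresses quoted in this conversation.

```c
#include <stdbool.h>

/* Illustrative DPCD boundaries from the discussion:
 *   300h-309h: source OUI block written by the patch
 *   30Ah-30Bh: still valid (used on sinks, per the comment above)
 *   30Ch-30Fh: reserved */
#define DPCD_RESERVED_START 0x30C

/* True if a write of `len` bytes starting at `addr` would spill
 * into the reserved region beginning at 30Ch. */
static bool write_touches_reserved(unsigned addr, unsigned len)
{
    return addr + len > DPCD_RESERVED_START;
}
```

Under this check, the original 16-byte write starting at 300h reaches into the reserved region, while both the 10-byte patch and a 12-byte write ending at 30Bh stay clear of it.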
Thanks. While I don't track exact releases, I will try to update here when that change has made it out, if you are interested in trying it out.
Don't send more bytes than necessary to avoid problems with displays that only accept the exact number of bytes defined in the DisplayPort standard.
Currently, 16 bytes are allocated and sent, although only addresses from 300h to 309h are valid. The remaining bytes are zeroed, but that still seems to cause problems on some displays that won't accept the additional bytes. Therefore, reduce the buffer size to the required amount and only send the relevant bytes.
This resolves the problem for me on TUXEDO InfinityBook Max and TUXEDO Stellaris devices with 5050 and 5060 GPUs. I have no evidence that this caused any actual problems with the driver, but doing things correctly and avoiding warnings in the console is always good.
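The sizing change the commit message describes could be sketched roughly as follows. This is not the actual NVIDIA code: the macro and helper names are invented, and the contents of 303h–309h are simplified to zero fill; only the 300h–309h range comes from the patch itself.

```c
#include <string.h>

/* Addresses from the commit message: 300h is the start of the
 * source OUI block and 309h is the last byte this write covers. */
#define DPCD_SOURCE_OUI_START 0x300
#define DPCD_SOURCE_OUI_LAST  0x309

/* Derive the AUX transfer size from the register range instead of
 * using a fixed 16-byte buffer, so no bytes beyond 309h are sent. */
#define DPCD_SOURCE_OUI_SIZE  (DPCD_SOURCE_OUI_LAST - DPCD_SOURCE_OUI_START + 1)

/* Hypothetical helper: place the 3-byte OUI at the start of the
 * payload and zero the rest (the real field layout of 303h-309h is
 * simplified here). Returns the number of bytes to transmit. */
static int build_source_oui_payload(unsigned char *buf, unsigned buflen,
                                    const unsigned char oui[3])
{
    if (buflen < DPCD_SOURCE_OUI_SIZE)
        return -1;
    memset(buf, 0, DPCD_SOURCE_OUI_SIZE);
    memcpy(buf, oui, 3);          /* OUI occupies 300h-302h */
    return DPCD_SOURCE_OUI_SIZE;  /* exactly 10 bytes, not 16 */
}
```

The point of deriving the size from the register range is that the transfer length can no longer drift out of sync with the addresses being written.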